Uncertainty Quantification Via the Posterior Predictive Variance

Chaudhuri, Sanjay, Dustin, Dean, Clarke, Bertrand

arXiv.org Machine Learning

Abstract: We use the law of total variance to generate multiple expansions of the posterior predictive variance. These expansions are sums of terms involving conditional expectations and conditional variances, and they quantify the sources of predictive uncertainty. Since the posterior predictive variance is fixed given the model, it is a constant quantity that is conserved across these expansions. The terms in the expansions can be assessed in an absolute or relative sense to understand the main contributors to the length of prediction intervals. We quantify the term-wise uncertainty across expansions that vary in the number of terms and the order of the conditioning variables. In particular, given that a specific term in one expansion is small or zero, we identify the terms in other expansions that must also be small or zero. We illustrate this approach to predictive model assessment in several well-known models.

The Setting and Intuition

Everyone uses prediction intervals (PI's), but few examine their structure or, more precisely, how they should be interpreted in the context of a model with multiple components. Often PI's seem overconfident (too narrow) or useless (too wide). Both frequentist and Bayesian practitioners routinely report PI's.
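The core identity behind the abstract is the law of total variance, Var(Y) = E[Var(Y | θ)] + Var(E[Y | θ]): the posterior predictive variance splits into a term for observation-level noise and a term for parameter uncertainty. The sketch below checks this decomposition by Monte Carlo for a hypothetical toy model (the specific posterior and noise level are illustrative assumptions, not from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (assumption for illustration):
#   theta ~ N(0, 1)            -- draws from a posterior
#   Y | theta ~ N(theta, 0.5^2) -- observation model with known noise
n = 200_000
sigma = 0.5
theta = rng.normal(0.0, 1.0, size=n)   # posterior draws
y = rng.normal(theta, sigma)           # posterior predictive draws

total_var = y.var()                    # Var(Y), estimated from predictive draws
within = sigma**2                      # E[Var(Y | theta)] = sigma^2 (constant here)
between = theta.var()                  # Var(E[Y | theta]) = Var(theta)

# Law of total variance: the two terms should sum to the predictive variance.
print(total_var, within + between)     # both close to 0.25 + 1.0 = 1.25
```

Comparing the relative sizes of `within` and `between` is the kind of term-wise assessment the abstract describes: here parameter uncertainty (`between` ≈ 1.0) dominates observation noise (`within` = 0.25), so shrinking the posterior would do more to narrow a prediction interval than reducing measurement error.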



Psychologists made people look at spiders. They didn't like it.

Popular Science

Humans will try to focus on almost anything else. There are plenty of studies examining why humans are so hardwired to detest spiders. However, fewer researchers have spent time investigating just how far we'll go to avoid even looking at them. At the University of Nebraska-Lincoln, psychologists decided to find out for themselves.